
    Decoherence in a system of many two-level atoms

    I show that the decoherence in a system of N degenerate two-level atoms interacting with a bosonic heat bath is, for any number of atoms N, governed by a generalized Hamming distance (called the "decoherence metric") between the superposed quantum states, with a time-dependent metric tensor that is specific for the heat bath. The decoherence metric allows for the complete characterization of the decoherence of all possible superpositions of many-particle states, and can be applied to minimize the overall decoherence in a quantum memory. For qubits which are far apart, the decoherence is given by a function describing single-qubit decoherence times the standard Hamming distance. I apply the theory to cold atoms in an optical lattice interacting with black-body radiation. Comment: replaced with published version
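    A minimal sketch of the structure described in the abstract, with assumed notation (the bath-specific metric tensor g_ij(t) is not given here): for superpositions of basis states labelled by bit strings s and s', the decoherence is governed by

    \[
    D_t^2(s,s') \;=\; \sum_{i,j=1}^{N} g_{ij}(t)\,(s_i - s'_i)(s_j - s'_j),
    \]

    and for qubits that are far apart the metric tensor becomes approximately diagonal, g_{ij}(t) ≈ f(t) δ_{ij}, so that D_t^2(s,s') ≈ f(t) d_H(s,s'), with f(t) the single-qubit decoherence function and d_H the standard Hamming distance, as stated above.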

    Interest Rates and Information Geometry

    The space of probability distributions on a given sample space possesses natural geometric properties. For example, in the case of a smooth parametric family of probability distributions on the real line, the parameter space has a Riemannian structure induced by the embedding of the family into the Hilbert space of square-integrable functions, and is characterised by the Fisher-Rao metric. In the nonparametric case the relevant geometry is determined by the spherical distance function of Bhattacharyya. In the context of term structure modelling, we show that minus the derivative of the discount function with respect to the maturity date gives rise to a probability density. This follows as a consequence of the positivity of interest rates. Therefore, by mapping the density functions associated with a given family of term structures to Hilbert space, the resulting metrical geometry can be used to analyse the relationship of yield curves to one another. We show that the general arbitrage-free yield curve dynamics can be represented as a process taking values in the convex space of smooth density functions on the positive real line. It follows that the theory of interest rate dynamics can be represented by a class of processes in Hilbert space. We also derive the dynamics for the central moments associated with the distribution determined by the yield curve. Comment: 20 pages, 3 figures
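    As a worked illustration of the density construction mentioned above (notation assumed here, not taken from the paper): writing P_t(x) for the discount function at time t and maturity x, positivity of interest rates makes P_t decreasing in x, with P_t(0) = 1 and P_t(x) → 0 as x → ∞, so

    \[
    \rho_t(x) \;=\; -\,\frac{\partial P_t(x)}{\partial x} \;\ge\; 0,
    \qquad
    \int_0^\infty \rho_t(x)\,dx \;=\; P_t(0) - \lim_{x\to\infty} P_t(x) \;=\; 1,
    \]

    i.e. each yield curve determines a probability density on the positive real line. Mapping densities to Hilbert space via ψ_t = √ρ_t, the spherical distance of Bhattacharyya between two term structures is d(ρ^(1), ρ^(2)) = arccos ∫ √(ρ_t^(1)(x) ρ_t^(2)(x)) dx, which is the nonparametric geometry referred to above.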

    Detection of trend changes in time series using Bayesian inference

    Change points in time series are perceived as isolated singularities where two regular trends of a given signal do not match. The detection of such transitions is of fundamental interest for the understanding of the system's internal dynamics. In practice, observational noise makes it difficult to detect such change points in time series. In this work we elaborate a Bayesian method to estimate the location of the singularities and to produce some confidence intervals. We validate the ability and sensitivity of our inference method by estimating change points of synthetic data sets. As an application we use our algorithm to analyze the annual flow volume of the Nile River at Aswan from 1871 to 1970, where we confirm a well-established significant transition point within the time series. Comment: 9 pages, 12 figures, submitted
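    A minimal sketch of this kind of change-point inference (a generic Bayesian calculation for a piecewise-linear trend in Gaussian noise, not the authors' exact model; the trend parameters are profiled out here rather than marginalized):

```python
import numpy as np

def changepoint_posterior(t, y, sigma):
    """Posterior over the change-point index of a piecewise-linear trend
    observed in i.i.d. Gaussian noise of known standard deviation sigma,
    with a uniform prior over admissible change-point locations."""
    n = len(y)
    log_post = np.full(n, -np.inf)
    for k in range(2, n - 2):                      # at least 2 points per segment
        resid = 0.0
        for seg in (slice(0, k), slice(k, n)):
            m = k if seg.start == 0 else n - k
            A = np.vstack([t[seg], np.ones(m)]).T  # straight-line design matrix
            coef = np.linalg.lstsq(A, y[seg], rcond=None)[0]
            resid += np.sum((y[seg] - A @ coef) ** 2)
        log_post[k] = -0.5 * resid / sigma**2      # Gaussian log-likelihood (up to a constant)
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()

# Synthetic example: a trend change at index 60, recovered from noisy data.
rng = np.random.default_rng(0)
t = np.arange(100.0)
y = np.where(t < 60, 0.5 * t, 30.0 + 2.0 * (t - 60)) + rng.normal(0.0, 3.0, 100)
post = changepoint_posterior(t, y, sigma=3.0)
print("MAP change point:", post.argmax())
```

    Confidence intervals for the change-point location can be read off the posterior directly, e.g. as the smallest set of indices whose posterior mass exceeds 95%.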

    A New Approach to Time Domain Classification of Broadband Noise in Gravitational Wave Data

    Broadband noise in gravitational wave (GW) detectors, also known as triggers, can often be a deterrent to the efficiency with which astrophysical search pipelines detect sources. It is important to understand their instrumental or environmental origin so that they can be eliminated or accounted for in the data. Since the number of triggers is large, data mining approaches such as clustering and classification are useful tools for this task. Classification of triggers based on a handful of discrete properties has been done in the past, but the rich information contained in the waveform, or 'shape', of the triggers has so far seen only restricted exploration. This paper presents a new way to classify triggers that draws on both the trigger waveforms and their discrete physical properties, using a sequential combination of the Longest Common Sub-Sequence (LCSS), and LCSS coupled with Fast Time Series Evaluation (FTSE), for waveform classification, and multidimensional hierarchical classification (MHC) analysis for the grouping based on physical properties. A generalized k-means algorithm is used with the LCSS (and LCSS+FTSE) for clustering the triggers, with a validity measure to determine the correct number of clusters in the absence of any prior knowledge. The results are demonstrated by simulations and by application to a segment of real LIGO data from the sixth science run. Comment: 16 pages, 16 figures
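    A minimal sketch of the Longest Common Sub-Sequence similarity on which the waveform classification is built (a textbook dynamic-programming LCSS for real-valued series with an amplitude tolerance; the FTSE speed-up, the generalized k-means clustering, and the MHC stage are not shown):

```python
import numpy as np

def lcss_similarity(x, y, eps):
    """LCSS similarity between two 1-D series: the fraction of samples that
    can be matched, in order, within an amplitude tolerance eps."""
    n, m = len(x), len(y)
    L = np.zeros((n + 1, m + 1), dtype=int)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if abs(x[i - 1] - y[j - 1]) <= eps:
                L[i, j] = L[i - 1, j - 1] + 1      # samples i and j are matched
            else:
                L[i, j] = max(L[i - 1, j], L[i, j - 1])
    return L[n, m] / min(n, m)

# A slightly time-shifted copy of a waveform scores close to 1,
# while an unrelated noise burst scores noticeably lower.
t = np.linspace(0.0, 1.0, 200)
a = np.sin(2 * np.pi * 5 * t)
b = np.sin(2 * np.pi * 5 * (t - 0.01))
c = np.random.default_rng(1).normal(size=200)
print(lcss_similarity(a, b, eps=0.1), lcss_similarity(a, c, eps=0.1))
```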

    Fast parameter inference in a biomechanical model of the left ventricle by using statistical emulation

    A central problem in biomechanical studies of personalized human left ventricular modelling is estimating the material properties and biophysical parameters from in vivo clinical measurements in a timeframe that is suitable for use within a clinic. Understanding these properties can provide insight into heart function or dysfunction and help to inform personalized medicine. However, finding a solution to the differential equations which mathematically describe the kinematics and dynamics of the myocardium through numerical integration can be computationally expensive. To circumvent this issue, we use the concept of emulation to infer the myocardial properties of a healthy volunteer in a viable clinical timeframe by using in vivo magnetic resonance image data. Emulation methods avoid computationally expensive simulations from the left ventricular model by replacing the biomechanical model, which is defined in terms of explicit partial differential equations, with a surrogate model inferred from simulations generated before the arrival of a patient, vastly improving computational efficiency at the clinic. We compare and contrast two emulation strategies: emulation of the computational model outputs and emulation of the loss between the observed patient data and the computational model outputs. These strategies are tested with two interpolation methods, as well as two loss functions. The best combination of methods is found by comparing the accuracy of parameter inference on simulated data for each combination. This combination, using the output emulation method, with local Gaussian process interpolation and the Euclidean loss function, provides accurate parameter inference in both simulated and clinical data, with a reduction in the computational cost of about three orders of magnitude compared with numerical integration of the differential equations by using finite element discretization techniques.
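    A minimal sketch of the output-emulation strategy described above, under illustrative assumptions: a toy function stands in for the expensive biomechanical simulator, a single global Gaussian process replaces the local GP interpolation used in the paper, and the parameter ranges and design size are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(theta):
    """Toy stand-in for the expensive forward model: maps parameters to outputs."""
    a, b = theta
    x = np.linspace(0.0, 1.0, 20)
    return a * np.sin(2 * np.pi * x) + b * x

# 1. Offline stage (before the patient arrives): run the simulator on a design.
rng = np.random.default_rng(0)
design = rng.uniform([0.5, 0.0], [2.0, 1.0], size=(200, 2))
outputs = np.array([simulator(th) for th in design])

# 2. Fit an emulator of the simulator outputs.
gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=[0.3, 0.3]),
                              normalize_y=True).fit(design, outputs)

# 3. Online stage: infer parameters by minimizing the Euclidean loss
#    between the observed data and the emulated outputs.
theta_true = np.array([1.3, 0.4])
data = simulator(theta_true) + rng.normal(0.0, 0.01, 20)

loss = lambda th: np.linalg.norm(gp.predict(th.reshape(1, -1))[0] - data)
fit = minimize(loss, x0=[1.0, 0.5], bounds=[(0.5, 2.0), (0.0, 1.0)])
print("estimated parameters:", fit.x, "true:", theta_true)
```

    The design choice mirrors the abstract: all expensive computation happens offline on a pre-specified design, while the per-patient cost reduces to cheap emulator evaluations inside an optimization loop.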

    On finite p-groups whose automorphisms are all central

    An automorphism α of a group G is said to be central if α commutes with every inner automorphism of G. We construct a family of non-special finite p-groups having abelian automorphism groups. These groups provide counterexamples to a conjecture of A. Mahalanobis [Israel J. Math., 165 (2008), 161-187]. We also construct a family of finite p-groups having non-abelian automorphism groups in which all automorphisms are central. This solves a problem of I. Malinowska [Advances in group theory, Aracne Editrice, Rome 2002, 111-127]. Comment: 11 pages; counterexamples to a conjecture from [Israel J. Math., 165 (2008), 161-187]; this paper will appear in Israel J. Math. in 201
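    For readers less familiar with the terminology, the standard equivalent characterizations of a central automorphism (textbook facts, not specific to this paper) are

    \[
    \alpha\,\sigma_g = \sigma_g\,\alpha \ \ \text{for all } g \in G
    \quad\Longleftrightarrow\quad
    g^{-1}\alpha(g) \in Z(G) \ \ \text{for all } g \in G,
    \]

    where σ_g denotes the inner automorphism x ↦ g x g^{-1}. Equivalently, the central automorphisms form the centralizer of Inn(G) in Aut(G), and every central automorphism induces the identity on G/Z(G).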

    A geometric approach to visualization of variability in functional data

    We propose a new method for the construction and visualization of boxplot-type displays for functional data. We use a recent functional data analysis framework, based on a representation of functions called square-root slope functions, to decompose observed variation in functional data into three main components: amplitude, phase, and vertical translation. We then construct separate displays for each component, using the geometry and metric of each representation space, based on a novel definition of the median, the two quartiles, and extreme observations. Because the outlyingness of functional data is a complex concept, we propose to identify outliers based on any of the three main components after decomposition. We provide a variety of visualization tools for the proposed boxplot-type displays, including surface plots. We evaluate the proposed method using extensive simulations and then focus our attention on three real data applications: exploratory data analysis of sea surface temperature functions, electrocardiogram functions, and growth curves.
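    As background on the representation named above (the square-root slope function framework is standard in elastic functional data analysis; the formulas below are the usual ones, not taken verbatim from the paper): a function f is represented by its square-root slope function, and amplitude differences are measured after optimally warping out phase,

    \[
    q(t) = \operatorname{sign}\!\big(\dot f(t)\big)\sqrt{|\dot f(t)|},
    \qquad
    d_a(f_1, f_2) = \inf_{\gamma}\,\big\| q_1 - (q_2 \circ \gamma)\sqrt{\dot\gamma}\,\big\|_{L^2},
    \]

    where the infimum is taken over smooth, orientation-preserving warpings γ of the domain; the optimal γ carries the phase component, while vertical translation is removed by the derivative and handled as a separate component.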

    Radio-loud Narrow-Line Type 1 Quasars

    We present the first systematic study of (non-radio-selected) radio-loud narrow-line Seyfert 1 (NLS1) galaxies. Cross-correlation of the 'Catalogue of Quasars and Active Nuclei' with several radio and optical catalogues led to the identification of 11 radio-loud NLS1 candidates, including 4 previously known ones. Most of the radio-loud NLS1s are compact, steep-spectrum sources accreting close to, or above, the Eddington limit. The radio-loud NLS1s of our sample are remarkable in that they occupy a previously rarely populated regime in NLS1 multi-wavelength parameter space. While their [OIII]/H_beta and FeII/H_beta intensity ratios almost cover the whole range observed in NLS1 galaxies, their radio properties extend the range of radio-loud objects to those with small widths of the broad Balmer lines. Among the radio-detected NLS1 galaxies, the radio index R is distributed quite smoothly up to the critical value of R ~ 10 and covers about 4 orders of magnitude in total. Statistics show that ~7% of the NLS1 galaxies are formally radio-loud, while only 2.5% exceed a radio index R > 100. Several mechanisms are considered as explanations for the radio loudness of the NLS1 galaxies and for the lower frequency of radio-loud objects among NLS1s than among quasars. While the properties of most sources (with 2-3 exceptions) generally do not favor relativistic beaming, the combination of accretion mode and spin may explain the observations. (abbreviated) Comment: Astronomical Journal (first submitted in Dec. 2005); 45 pages incl. 1 colour figure
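    For context, the radio-loudness index referred to above is conventionally defined as a ratio of radio to optical flux density (the exact bands used in this study are an assumption here),

    \[
    R \;=\; \frac{f_\nu(\mathrm{radio,\ e.g.\ 5\ GHz})}{f_\nu(\mathrm{optical,\ e.g.\ }B\ \mathrm{band})},
    \]

    with R ≳ 10 conventionally separating radio-loud from radio-quiet objects, consistent with the critical value of R ~ 10 quoted above.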

    An empirical goodness-of-fit test for multivariate distributions

    An empirical test is presented as a tool for assessing whether a specified multivariate probability model is suitable to describe the underlying distribution of a set of observations. This test is based on the premise that, given any probability distribution, the Mahalanobis distances corresponding to data generated from that distribution will likewise follow a distinct distribution that can be estimated well by means of a large sample. We demonstrate the effectiveness of the test for detecting departures from several multivariate distributions. We then apply the test to a real multivariate data set to confirm that it is consistent with a multivariate beta model.
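    A minimal sketch of the idea behind the test (a generic implementation of the stated premise, not the authors' exact procedure; a multivariate normal model is used here purely for illustration, whereas the paper also considers models such as the multivariate beta): compute the Mahalanobis distances of the observations under the hypothesized model, simulate a large reference sample of distances from that model, and compare the two empirical distributions.

```python
import numpy as np
from scipy.stats import ks_2samp

def mahalanobis_gof(data, sampler, mean, cov, n_ref=100_000, seed=0):
    """Goodness-of-fit check: compare the Mahalanobis distances of `data`
    (under the hypothesized mean/covariance) with those of a large sample
    drawn from the hypothesized model by `sampler(n, rng)`.
    Returns the two-sample Kolmogorov-Smirnov p-value."""
    prec = np.linalg.inv(cov)
    d = lambda x: np.sqrt(np.einsum('ij,jk,ik->i', x - mean, prec, x - mean))
    rng = np.random.default_rng(seed)
    ref = sampler(n_ref, rng)                      # large sample from the model
    return ks_2samp(d(np.asarray(data)), d(ref)).pvalue

# Usage: test whether a sample is consistent with a bivariate normal model.
rng = np.random.default_rng(1)
mean, cov = np.zeros(2), np.array([[1.0, 0.3], [0.3, 2.0]])
obs = rng.multivariate_normal(mean, cov, size=300)
sampler = lambda n, g: g.multivariate_normal(mean, cov, size=n)
print("p-value:", mahalanobis_gof(obs, sampler, mean, cov))
```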